

Search for: All records

Creators/Authors contains: "Kitzes, Justin"


  1. Large, well described gaps exist in both what we know and what we need to know to address the biodiversity crisis. Artificial intelligence (AI) offers new potential for filling these knowledge gaps, but where the biggest and most influential gains could be made remains unclear. To date, biodiversity-related uses of AI have largely focused on tracking and monitoring of wildlife populations. Rapid progress is being made in the use of AI to build phylogenetic trees and species distribution models. However, AI also has considerable unrealized potential in the re-evaluation of important ecological questions, especially those that require the integration of disparate and inherently complex data types, such as images, video, text, audio and DNA. This Review describes the current and potential future use of AI to address seven clearly defined shortfalls in biodiversity knowledge. Recommended steps for AI-based improvements include the re-use of existing image data and the development of novel paradigms, including the collaborative generation of new testable hypotheses. The resulting expansion of biodiversity knowledge could lead to science spanning from genes to ecosystems — advances that might represent our best hope for meeting the rapidly approaching 2030 targets of the Global Biodiversity Framework. 
    Free, publicly-accessible full text available March 1, 2026
  2. The AudioMoth is a popular autonomous recording unit (ARU) that is widely used to record vocalizing species in the field. Despite its growing use, there have been few quantitative tests on the performance of this recorder. Such information is needed to design effective field surveys and to appropriately analyze recordings made by this device. Here, we report the results of two tests designed to evaluate the performance characteristics of the AudioMoth recorder. First, we performed indoor and outdoor pink noise playback experiments to evaluate how different device settings, orientations, mounting conditions, and housing options affect frequency response patterns. We found little variation in acoustic performance between devices and relatively little effect of placing recorders in a plastic bag for weather protection. The AudioMoth has a mostly flat on-axis response with a boost above 3 kHz, with a generally omnidirectional response that suffers from attenuation behind the recorder, an effect that is accentuated when it is mounted on a tree. Second, we performed battery life tests under a variety of recording frequencies, gain settings, environmental temperatures, and battery types. We found that standard alkaline batteries last for an average of 189 h at room temperature using a 32 kHz sample rate, and that lithium batteries can last for twice as long at freezing temperatures compared to alkaline batteries. This information will aid researchers in both collecting and analyzing recordings generated by the AudioMoth recorder. 
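The battery-life figure reported above translates directly into deployment planning. The sketch below turns the abstract's 189 h figure (alkaline batteries, room temperature, 32 kHz sample rate) into an estimated survey length; the daily duty-cycle schedule is a hypothetical example, not from the study.

```python
# Rough deployment-length estimate from the battery life reported above.
# 189 h (alkaline, room temperature, 32 kHz) is taken from the abstract;
# the 2 h/day dawn-chorus schedule is an illustrative assumption.

def survey_days(battery_hours, recording_hours_per_day):
    """Days of deployment supported by a battery life and a daily duty cycle."""
    return battery_hours / recording_hours_per_day

# Recording a 2 h dawn chorus each day:
alkaline_days = survey_days(189, 2)  # 94.5 days of daily recording
```

A continuous recording schedule would instead exhaust the same batteries in under eight days, which is why duty-cycled schedules are common in long field deployments.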
  3. Birds singing in choruses must contend with the possibility of interfering with each other's songs, but not all species will interfere with each other to the same extent due to signal partitioning. Some evidence suggests that singing birds will avoid temporal overlap only in cases where there is overlap in the frequencies their songs occupy, but the extent to which this behaviour varies according to level of frequency overlap is not yet well understood. We investigated the hypothesis that birds will increasingly avoid heterospecific temporal overlap as their frequency overlap increases by testing for a linear correlation between frequency overlap and temporal avoidance across a community of temperate eastern North American birds. We found that there was a significant correlation across the whole community and within 12 of 15 commonly occurring individual species, which supports our hypothesis and adds to the growing body of evidence that birds adjust the timing of their songs in response to frequency overlap. 
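The community-level test described above reduces to a linear correlation between two per-species-pair quantities. The sketch below computes a Pearson correlation coefficient on made-up illustrative values; the variable names and numbers are hypothetical, not data from the study.

```python
# Conceptual sketch of the test described above: a linear correlation between
# pairwise frequency overlap and temporal avoidance. All values are invented
# for illustration; they are not data from the study.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical species pairs: fraction of shared song bandwidth vs. an
# index of how strongly the pair avoids singing at the same time.
freq_overlap = [0.1, 0.3, 0.5, 0.7, 0.9]
temporal_avoidance = [0.05, 0.2, 0.55, 0.6, 0.85]
r = pearson_r(freq_overlap, temporal_avoidance)  # strongly positive here
```

A significant positive r, as the abstract reports for the community and for 12 of 15 common species, is what the hypothesis predicts: more frequency overlap, more temporal avoidance.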
  4. Community science image libraries offer a massive, but largely untapped, source of observational data for phenological research. The iNaturalist platform offers a particularly rich archive, containing more than 49 million verifiable, georeferenced, open access images, encompassing seven continents and over 278,000 species. A critical limitation preventing scientists from taking full advantage of this rich data source is labor. Each image must be manually inspected and categorized by phenophase, which is both time-intensive and costly. Consequently, researchers may only be able to use a subset of the total number of images available in the database. While iNaturalist has the potential to yield enough data for high-resolution and spatially extensive studies, it requires more efficient tools for phenological data extraction. A promising solution is automation of the image annotation process using deep learning. Recent innovations in deep learning have made these open-source tools accessible to a general research audience. However, it is unknown whether deep learning tools can accurately and efficiently annotate phenophases in community science images. Here, we train a convolutional neural network (CNN) to annotate images of Alliaria petiolata into distinct phenophases from iNaturalist and compare the performance of the model with non-expert human annotators. We demonstrate that researchers can successfully employ deep learning techniques to extract phenological information from community science images. A CNN classified two-stage phenology (flowering and non-flowering) with 95.9% accuracy and classified four-stage phenology (vegetative, budding, flowering, and fruiting) with 86.4% accuracy. The overall accuracy of the CNN did not differ from humans (p = 0.383), although performance varied across phenophases.
We found that a primary challenge of using deep learning for image annotation was not related to the model itself, but instead to the quality of the community science images. Up to 4% of A. petiolata images in iNaturalist were taken from an improper distance, were physically manipulated, or were digitally altered, which limited both human and machine annotators in accurately classifying phenology. Thus, we provide a list of photography guidelines that could be included in community science platforms to inform community scientists in the best practices for creating images that facilitate phenological analysis.
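The accuracy comparison described above amounts to scoring each annotator's labels against a reference set, per image and per phenophase. The sketch below shows that scoring step on hypothetical four-stage labels; the 95.9% and 86.4% figures quoted in the abstract come from the actual study, while everything in the code is invented for illustration.

```python
# Sketch of the accuracy scoring described above: comparing automated
# phenophase labels against reference labels. All labels here are
# hypothetical toy data, not from the study.
from collections import Counter

def accuracy(predicted, reference):
    """Fraction of images where the predicted label matches the reference."""
    assert len(predicted) == len(reference)
    hits = sum(p == r for p, r in zip(predicted, reference))
    return hits / len(reference)

# Hypothetical four-stage phenophase labels for ten images:
reference = ["vegetative", "budding", "flowering", "flowering", "fruiting",
             "vegetative", "flowering", "budding", "fruiting", "flowering"]
cnn_labels = ["vegetative", "budding", "flowering", "flowering", "fruiting",
              "budding", "flowering", "budding", "fruiting", "vegetative"]

acc = accuracy(cnn_labels, reference)  # 8 of 10 correct -> 0.8
# Which true phenophases were misclassified (performance varies by class):
missed_by_class = Counter(r for p, r in zip(cnn_labels, reference) if p != r)
```

Tallying errors by true class, as in the last line, is how per-phenophase variation of the kind the abstract mentions would surface in practice.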
  5. In late December 1973, the United States enacted what some would come to call “the pitbull of environmental laws.” In the 50 years since, the formidable regulatory teeth of the Endangered Species Act (ESA) have been credited with considerable successes, obliging agencies to draw upon the best available science to protect species and habitats. Yet human pressures continue to push the planet toward extinctions on a massive scale. With that prospect looming, and with scientific understanding ever changing, Science invited experts to discuss how the ESA has evolved and what its future might hold. —Brad Wible
  6. In late December 1973, the United States enacted what some would come to call “the pitbull of environmental laws.” In the 50 years since, the formidable regulatory teeth of the Endangered Species Act (ESA) have been credited with considerable successes, obliging agencies to draw upon the best available science to protect species and habitats. Yet human pressures continue to push the planet toward extinctions on a massive scale. With that prospect looming, and with scientific understanding ever changing, Science invited experts to discuss how the ESA has evolved and what its future might hold. 
  7. Landscape-scale bioacoustic projects have become a popular approach to biodiversity monitoring. Combining passive acoustic monitoring recordings and automated detection provides an effective means of monitoring sound-producing species' occupancy and phenology and can lend insight into unobserved behaviours and patterns. The availability of low-cost recording hardware has lowered barriers to large-scale data collection, but technological barriers in data analysis remain a bottleneck for extracting biological insight from bioacoustic datasets.
We provide a robust and open-source Python toolkit for detecting and localizing biological sounds in acoustic data. OpenSoundscape provides access to automated acoustic detection, classification and localization methods through a simple and easy-to-use set of tools. Extensive documentation and tutorials provide step-by-step instructions and examples of end-to-end analysis of bioacoustic data. Here, we describe the functionality of this package and provide concise examples of bioacoustic analyses with OpenSoundscape.
By providing an interface for bioacoustic data and methods, we hope this package will lead to increased adoption of bioacoustics methods and ultimately to enhanced insights for ecology and conservation.
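To make the "automated detection" step above concrete, the sketch below implements a minimal sliding-window energy detector on a synthetic waveform. This is not the OpenSoundscape API, only an illustration of the underlying idea; consult the package's own documentation for its actual interface, and note that all names and parameters here are invented for the example.

```python
# Conceptual sketch of automated acoustic event detection: flag time spans
# where windowed RMS energy exceeds a threshold. This is NOT OpenSoundscape's
# API; it only illustrates the kind of detection such toolkits automate.
import math

def detect_events(samples, sample_rate, window_s=0.1, threshold=0.1):
    """Return (start_s, end_s) spans where windowed RMS exceeds threshold."""
    win = max(1, int(window_s * sample_rate))
    events, start = [], None
    for i in range(0, len(samples) - win + 1, win):
        rms = math.sqrt(sum(s * s for s in samples[i:i + win]) / win)
        t = i / sample_rate
        if rms >= threshold and start is None:
            start = t                      # event onset
        elif rms < threshold and start is not None:
            events.append((start, t))      # event offset
            start = None
    if start is not None:                  # event ran to end of clip
        events.append((start, len(samples) / sample_rate))
    return events

# Synthetic 1 s clip at 8 kHz: silence, a 0.2 s 440 Hz tone burst, silence.
sr = 8000
clip = [0.0] * sr
for i in range(int(0.4 * sr), int(0.6 * sr)):
    clip[i] = 0.5 * math.sin(2 * math.pi * 440 * i / sr)

events = detect_events(clip, sr)  # one detection spanning the tone burst
```

Real detectors in bioacoustic toolkits operate on spectrograms and learned classifiers rather than raw RMS, but the output shape is the same: a list of detected time spans that downstream occupancy and phenology analyses consume.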